Expected Posterior Priors in Factor Analysis

Author

  • Hedibert Freitas Lopes

Abstract

Bayesian inference in factor analytic models has received renewed attention in recent years, partly due to computational advances but also due to applied work generating new factor structures, as exemplified by recent developments in financial time series modeling. The focus of our current work is the commonly overlooked problem of prior specification and sensitivity in factor models. We address it by implementing Pérez and Berger's (2002) Expected Posterior (EP) prior distributions. As opposed to alternative objective priors, such as Jeffreys' prior and Bernardo's prior, the EP prior has several important theoretical and practical properties, perhaps the most important of which are its straightforward computation through MCMC methods and its coherence when comparing multiple models.
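For reference (an annotation added here, not part of the abstract), Pérez and Berger's (2002) EP prior is built from an initial baseline prior \pi^{N}, typically a default or improper prior, by averaging its posterior over imaginary training samples y^{*} drawn from a suitable predictive distribution m^{*}; in generic notation,

    \pi^{EP}(\theta) = \int \pi^{N}(\theta \mid y^{*}) \, m^{*}(y^{*}) \, dy^{*},
    \qquad
    \pi^{N}(\theta \mid y^{*}) = \frac{f(y^{*} \mid \theta)\, \pi^{N}(\theta)}{\int f(y^{*} \mid \theta')\, \pi^{N}(\theta')\, d\theta'}.

Using the same m^{*} across all candidate models is what yields the coherence in model comparison mentioned above, and the integral can be approximated by MCMC, drawing y^{*} and then \theta from the corresponding baseline posterior.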


Similar resources

Power-Expected-Posterior Priors for Variable Selection in Gaussian Linear Models

Imaginary training samples are often used in Bayesian statistics to develop prior distributions, with appealing interpretations, for use in model comparison. Expected-posterior priors are defined via imaginary training samples coming from a common underlying predictive distribution m, using an initial baseline prior distribution. These priors can have subjective and also default Bayesian implem...
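As a sketch added here (following the usual power-expected-posterior construction rather than anything stated in the excerpt above), the PEP prior tempers the likelihood of the imaginary training sample y^{*} by a power 1/\delta before forming the expected posterior:

    \pi^{PEP}(\theta \mid \delta) = \int \pi^{N}(\theta \mid y^{*}, \delta) \, m^{*}(y^{*} \mid \delta) \, dy^{*},
    \qquad
    \pi^{N}(\theta \mid y^{*}, \delta) \propto f(y^{*} \mid \theta)^{1/\delta} \, \pi^{N}(\theta),

with \delta commonly set equal to the imaginary sample size n^{*}, so that the imaginary data contribute roughly one observation's worth of information and the particular choice of training sample matters less.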



Image Segmentation Using Gaussian Mixture Model

Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. In this paper, we fit a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, a pixel labeling corresponding to each pixel of the true image was obtained by Bayes' rule. In fact,...
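The pipeline described in this entry (fit a Gaussian mixture to pixel values by EM, then label each pixel with its most probable component via Bayes' rule) can be sketched roughly as follows. This is an illustrative sketch using scikit-learn and a synthetic two-region "image", not code from the paper; the choice of K = 2 components and all variable names are assumptions.

    import numpy as np
    from sklearn.mixture import GaussianMixture  # EM-based Gaussian mixture fitting

    # Synthetic RGB "image": two noisy color regions (stand-in for a real image array).
    rng = np.random.default_rng(0)
    h, w = 64, 64
    image = np.vstack([
        rng.normal(loc=[0.2, 0.3, 0.8], scale=0.05, size=(h * w // 2, 3)),
        rng.normal(loc=[0.8, 0.6, 0.1], scale=0.05, size=(h * w // 2, 3)),
    ]).reshape(h, w, 3)

    # Flatten pixels into an (n_pixels, n_channels) matrix for the mixture model.
    pixels = image.reshape(-1, 3)

    # Fit a K-component Gaussian mixture by EM; K = 2 is an assumed choice here.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    gmm.fit(pixels)

    # Label each pixel with its most probable component (MAP labeling via Bayes' rule).
    labels = gmm.predict(pixels).reshape(h, w)
    print(labels.shape, np.unique(labels))

The resulting label array has the same height and width as the image and can be displayed directly as a segmentation map.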


Bayesian model comparison based on expected posterior priors for discrete decomposable graphical models

The implementation of the Bayesian paradigm in model comparison can be problematic. In particular, prior distributions on the parameter space of each candidate model require special care. While it is well known that improper priors cannot be used routinely for Bayesian model comparison, we claim that in general the use of conventional priors (proper or improper) for model comparison should be r...


Intrinsic Priors for Testing Ordered Exponential Means

In Bayesian model selection or testing problems, Bayes factors under proper priors have been very successful. In practice, however, limited information and time constraints often require us to use noninformative priors, which are typically improper and are defined only up to arbitrary constants. The resulting Bayes factors are then not well defined. A recently proposed model selection criterion, t...



Journal title:

Volume:   Issue:

Pages:

Publication year: 2003